CARPool Covariance: Fast, unbiased covariance estimation for large-scale structure observables

Authors

Abstract

The covariance matrix $\boldsymbol{\Sigma}$ of non-linear clustering statistics that are measured in current and upcoming surveys is of fundamental interest for comparing cosmological theory and data, and is a crucial ingredient in the likelihood approximations underlying widely used parameter inference and forecasting methods. The extreme number of simulations needed to estimate $\boldsymbol{\Sigma}$ to sufficient accuracy poses a severe challenge. Approximating $\boldsymbol{\Sigma}$ using inexpensive but biased surrogates introduces model error with respect to the full simulations, especially in the non-linear regime of structure growth. To address this problem we develop a generalization of Convergence Acceleration by Regression and Pooling (CARPool) that combines a small number of simulations with fast surrogates to obtain low-noise estimates of $\boldsymbol{\Sigma}$ that are unbiased by construction. Our numerical examples use CARPool to combine GADGET-III $N$-body simulations with surrogates computed using the COmoving Lagrangian Acceleration (COLA) method. Even at the challenging redshift $z=0.5$, we find variance reductions of at least $\mathcal{O}(10^1)$ and up to $\mathcal{O}(10^4)$ for the elements of the matter power spectrum covariance matrix on scales $8.9\times 10^{-3}<k_\mathrm{max}<1.0\,h\,{\rm Mpc^{-1}}$. We demonstrate comparable performance for the covariance of the matter bispectrum, the matter correlation function, and the probability density function of the matter density field. We compare eigenvalues, likelihoods, and Fisher matrices computed with the CARPool covariance estimate against those from standard sample covariance estimators, and generally find considerable improvement, except in cases where $\boldsymbol{\Sigma}$ is severely ill-conditioned.
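The abstract only sketches the mechanism: pair a few expensive simulations with cheap surrogates run from the same initial conditions, and use many additional surrogate-only runs to cancel the surrogate's sampling noise. Below is a minimal numerical sketch of that control-variate idea applied to a covariance matrix with a fixed coefficient; the function names, toy Gaussian data, and the scalar coefficient of one are illustrative assumptions, not the estimator or code released with the paper.

```python
# Minimal sketch of the control-variate idea behind CARPool applied to
# covariance estimation. Function names, the fixed beta = 1, and the toy data
# are illustrative assumptions, not the authors' released code.
import numpy as np


def sample_cov(samples):
    """Unbiased sample covariance of rows shaped (n_realizations, n_bins)."""
    return np.cov(samples, rowvar=False, ddof=1)


def carpool_covariance(sim_paired, surr_paired, surr_many, beta=1.0):
    """Combine a few expensive simulations with many cheap surrogates.

    sim_paired  : (n_few,  n_bins) statistic from full simulations
    surr_paired : (n_few,  n_bins) surrogates run from the same initial conditions
    surr_many   : (n_many, n_bins) surrogate-only runs, with n_many >> n_few

    For a fixed beta the estimate is unbiased: the two surrogate sample
    covariances have the same expectation, so their difference only removes
    noise that is correlated with the noise of the simulation term.
    """
    return (sample_cov(sim_paired)
            - beta * (sample_cov(surr_paired) - sample_cov(surr_many)))


# Toy usage: correlated Gaussian mocks standing in for binned power spectra.
rng = np.random.default_rng(0)
n_bins, n_few, n_many = 20, 30, 3000
surr_cov = 0.5 ** np.abs(np.subtract.outer(np.arange(n_bins), np.arange(n_bins)))

# Paired runs share initial conditions, so the surrogate captures most of the
# simulation's fluctuations; the simulation adds a small independent component.
surr_paired = rng.multivariate_normal(np.zeros(n_bins), surr_cov, size=n_few)
sim_paired = surr_paired + 0.1 * rng.standard_normal((n_few, n_bins))
surr_many = rng.multivariate_normal(np.zeros(n_bins), surr_cov, size=n_many)

sim_cov_true = surr_cov + 0.01 * np.eye(n_bins)
plain = sample_cov(sim_paired)
pooled = carpool_covariance(sim_paired, surr_paired, surr_many)
print("max |error|, plain sample covariance:", np.abs(plain - sim_cov_true).max())
print("max |error|, CARPool-style estimate :", np.abs(pooled - sim_cov_true).max())
```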

Related articles

Non-linear shrinkage estimation of large-scale structure covariance

In many astrophysical settings, covariance matrices of large data sets have to be determined empirically from a finite number of mock realizations. The resulting noise degrades inference and precludes it completely if there are fewer realizations than data points. This work applies a recently proposed non-linear shrinkage estimator of covariance to a realistic example from large-scale structure...
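For intuition about what shrinkage does to a noisy sample covariance, here is a minimal sketch of linear shrinkage toward a scaled identity target; the non-linear estimator of the paper above, which corrects each sample eigenvalue individually, is not reproduced here, and the shrinkage weight alpha is a hand-picked placeholder rather than an optimised coefficient.

```python
# Minimal sketch of linear shrinkage toward a scaled identity target, for
# intuition only; alpha is a hand-picked placeholder, not the optimised
# coefficient of the non-linear shrinkage estimator discussed above.
import numpy as np


def linear_shrinkage(samples, alpha=0.3):
    """Convex combination of the sample covariance and a scaled identity."""
    s = np.cov(samples, rowvar=False, ddof=1)
    p = s.shape[0]
    target = (np.trace(s) / p) * np.eye(p)
    return (1.0 - alpha) * s + alpha * target


rng = np.random.default_rng(1)
x = rng.standard_normal((50, 200))  # fewer realizations (50) than data points (200)
print(np.linalg.cond(np.cov(x, rowvar=False, ddof=1)))  # sample covariance: rank deficient
print(np.linalg.cond(linear_shrinkage(x)))              # shrunk estimate: well conditioned
```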

Large-scale l0 sparse inverse covariance estimation

There has been significant interest in sparse inverse covariance estimation in areas such as statistics, machine learning, and signal processing. In this problem, the sparse inverse of a covariance matrix of a multivariate normal distribution is estimated. A Penalised LogLikelihood (PLL) optimisation problem is solved to obtain the matrix estimator, where the penalty is responsible for inducing...
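As a rough illustration of sparse inverse covariance estimation, the sketch below fits an l1-penalised (graphical lasso) model, a convex stand-in for the l0-penalised log-likelihood objective described above; the regularisation strength and the toy tridiagonal precision matrix are arbitrary illustrative choices.

```python
# l1-penalised (graphical lasso) fit as a convex stand-in for the l0-penalised
# objective; alpha and the toy precision matrix are arbitrary choices.
import numpy as np
from sklearn.covariance import GraphicalLasso

rng = np.random.default_rng(2)
p = 10
# Tridiagonal true precision matrix -> sparse conditional-dependence structure.
true_precision = np.eye(p) + 0.4 * (np.eye(p, k=1) + np.eye(p, k=-1))
samples = rng.multivariate_normal(np.zeros(p), np.linalg.inv(true_precision), size=500)

model = GraphicalLasso(alpha=0.05).fit(samples)
print(np.round(model.precision_, 2))  # entries far from the diagonal are driven to zero
```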

Bayesian Hierarchical Model for Large-Scale Covariance Matrix Estimation

Many bioinformatics problems implicitly depend on estimating a large-scale covariance matrix. The traditional approaches tend to give rise to high variance and low accuracy due to "overfitting." We cast the large-scale covariance matrix estimation problem into the Bayesian hierarchical model framework, and introduce dependency between covariance parameters. We demonstrate the advantages of our ap...
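The summary above is high level, so the following is only a generic conjugate example (a known-mean Gaussian likelihood with an inverse-Wishart prior) showing how a prior regularises a covariance estimate; it does not reproduce the hierarchical model or the dependencies between covariance parameters introduced in that work, and the prior scale and degrees of freedom are arbitrary placeholders.

```python
# Generic conjugate example (known-mean Gaussian likelihood, inverse-Wishart
# prior); not the hierarchical model of the paper above.
import numpy as np


def posterior_mean_cov(samples, prior_scale, prior_dof):
    """Posterior mean of Sigma under an inverse-Wishart(prior_scale, prior_dof) prior."""
    n, p = samples.shape
    scatter = samples.T @ samples  # sum of x_i x_i^T, with the mean assumed known (zero)
    return (prior_scale + scatter) / (prior_dof + n - p - 1)


rng = np.random.default_rng(3)
p, n = 5, 20
true_cov = np.diag(np.linspace(1.0, 3.0, p))
x = rng.multivariate_normal(np.zeros(p), true_cov, size=n)
print(posterior_mean_cov(x, prior_scale=np.eye(p), prior_dof=p + 2))
```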

Fast covariance estimation for sparse functional data

Smoothing of noisy sample covariances is an important component in functional data analysis. We propose a novel covariance smoothing method based on penalized splines and associated software. The proposed method is a bivariate spline smoother that is designed for covariance smoothing and can be used for sparse functional or longitudinal data. We propose a fast algorithm for covariance smoothing...
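One common way to realise this kind of estimator is to pool the off-diagonal raw products of centred observations across subjects and fit a bivariate smoother to them. The sketch below does this with a generic scipy smoothing spline rather than the penalised-spline estimator and software proposed in the paper above; every tuning value is a placeholder.

```python
# Rough sketch of smoothing-based covariance estimation for sparse
# longitudinal data using a generic scipy smoothing spline; all tuning values
# are placeholders, not the paper's penalised-spline method.
import numpy as np
from scipy.interpolate import SmoothBivariateSpline

rng = np.random.default_rng(4)
n_subjects, n_obs = 200, 5
s_pts, t_pts, products = [], [], []
for _ in range(n_subjects):
    t = np.sort(rng.uniform(0.0, 1.0, n_obs))  # few, irregular observation times
    score = rng.standard_normal()
    y = score * np.sin(np.pi * t) + 0.2 * rng.standard_normal(n_obs)  # zero-mean curves
    jj, kk = np.meshgrid(np.arange(n_obs), np.arange(n_obs))
    off_diag = jj != kk  # diagonal products are inflated by measurement noise
    s_pts.append(t[jj[off_diag]])
    t_pts.append(t[kk[off_diag]])
    products.append(y[jj[off_diag]] * y[kk[off_diag]])

s_all, t_all, prod_all = map(np.concatenate, (s_pts, t_pts, products))
smoother = SmoothBivariateSpline(s_all, t_all, prod_all, kx=3, ky=3, s=len(prod_all))
grid = np.linspace(0.0, 1.0, 21)
cov_surface = smoother(grid, grid)  # estimate of C(s, t); true surface is sin(pi*s)*sin(pi*t)
print(cov_surface.shape)
```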

Estimation of Large Covariance Matrices

This paper considers estimating a covariance matrix of p variables from n observations by either banding or tapering the sample covariance matrix, or estimating a banded version of the inverse of the covariance. We show that these estimates are consistent in the operator norm as long as (log p)/n → 0, and obtain explicit rates. The results are uniform over some fairly natural well-conditioned fam...
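Banding is the simplest of the regularisers analysed above: keep entries within a fixed distance of the diagonal and zero the rest. A minimal sketch, with the band width k fixed by hand rather than chosen by cross-validation as one would in practice:

```python
# Minimal sketch of banding a sample covariance matrix; the band width k is a
# hand-picked placeholder.
import numpy as np


def band(cov, k):
    """Keep entries within k bins of the diagonal, zero out the rest."""
    p = cov.shape[0]
    mask = np.abs(np.subtract.outer(np.arange(p), np.arange(p))) <= k
    return np.where(mask, cov, 0.0)


rng = np.random.default_rng(5)
p, n = 40, 100
true_cov = 0.6 ** np.abs(np.subtract.outer(np.arange(p), np.arange(p)))  # rapidly decaying off-diagonals
x = rng.multivariate_normal(np.zeros(p), true_cov, size=n)
s = np.cov(x, rowvar=False, ddof=1)
# Operator-norm (spectral-norm) errors of the raw and banded estimates.
print(np.linalg.norm(s - true_cov, 2), np.linalg.norm(band(s, k=5) - true_cov, 2))
```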

Journal

Journal title: Monthly Notices of the Royal Astronomical Society

Year: 2021

ISSN: 0035-8711, 1365-8711, 1365-2966

DOI: https://doi.org/10.1093/mnras/stab3097